Anomaly detection based on probability density function with Kullback-Leibler divergence

Authors

  • Wei Wang
  • Baoju Zhang
  • Dan Wang
  • Yu Jiang
  • Shan Qin
  • Lei Xue
Abstract

Anomaly detection is a popular problem in many fields. We investigate an anomaly detection method based on the probability density functions (PDFs) of different statuses. The constructed PDFs require only a small amount of training data, based on the Kullback–Leibler divergence method and a small-signal assumption. The measurement matrix was derived using principal component analysis (PCA), and the statistical detection indicator was established under an i.i.d. Gaussian noise background. The performance of the proposed anomaly detection method was tested with through-wall human detection experiments. The results showed that the proposed method could detect a human being behind brick and gypsum walls, but gave unremarkable results for a concrete wall. © 2016 Elsevier B.V. All rights reserved.
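As a rough illustration of the idea in the abstract (not the authors' actual algorithm, which also involves a PCA-derived measurement matrix and a specific detection indicator), the sketch below flags an anomaly when the KL divergence between Gaussian PDFs fitted to a reference recording and to an observed recording exceeds a threshold. All function names and the threshold value are illustrative assumptions:

```python
import numpy as np

def gaussian_kl(mu0, var0, mu1, var1):
    """KL divergence D(N(mu0, var0) || N(mu1, var1)) for 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def detect_anomaly(reference, observed, threshold=0.5):
    """Fit a Gaussian PDF to each signal and flag an anomaly when
    D(observed || reference) exceeds the (assumed) threshold."""
    mu0, var0 = reference.mean(), reference.var() + 1e-12
    mu1, var1 = observed.mean(), observed.var() + 1e-12
    d = gaussian_kl(mu1, var1, mu0, var0)
    return d > threshold, d

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 500)   # background status, e.g. "empty room"
anomalous = rng.normal(1.5, 1.0, 500)  # shifted mean, e.g. "human present"
print(detect_anomaly(baseline, anomalous))
```

A mean shift of 1.5 with unit variance gives a KL divergence of roughly 1.1, well above the assumed threshold, while two samples from the same distribution stay near zero.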


Related articles

Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confide...


Kullback-Leibler distance-based enhanced detection of incipient anomalies

Accurate and effective anomaly detection and diagnosis of modern engineering systems through process monitoring ensure the reliability and safety of a product while maintaining desired quality. In this paper, an innovative method based on Kullback-Leibler divergence for detecting incipient anomalies in highly correlated multivariate data is presented. We use a partial least squares (PLS) method as a mo...


Maximally Divergent Intervals for Anomaly Detection

We present new methods for batch anomaly detection in multivariate time series. Our methods are based on maximizing the Kullback-Leibler divergence between the data distributions within and outside an interval of the time series. An empirical analysis shows the benefits of our algorithms compared to methods that treat each time step independently of the others, without optimizing with respect t...
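The inside-versus-outside KL criterion described in this abstract can be sketched as a brute-force search (a toy illustration under a 1-D Gaussian assumption, not the authors' optimized algorithm; all names and parameters are made up):

```python
import numpy as np

def gauss_kl(mu0, var0, mu1, var1):
    """KL divergence between two 1-D Gaussian fits."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def most_divergent_interval(x, min_len=5):
    """Exhaustively search for the interval [i, j) whose Gaussian fit
    diverges most (in KL) from the Gaussian fit of the rest of the series."""
    n = len(x)
    best, best_score = (0, min_len), -np.inf
    for i in range(n - min_len):
        for j in range(i + min_len, n + 1):
            inside = x[i:j]
            outside = np.concatenate([x[:i], x[j:]])
            if len(outside) < min_len:
                continue
            s = gauss_kl(inside.mean(), inside.var() + 1e-9,
                         outside.mean(), outside.var() + 1e-9)
            if s > best_score:
                best, best_score = (i, j), s
    return best, best_score

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100)
x[40:55] += 4.0                       # injected anomalous interval
(i, j), score = most_divergent_interval(x)
print(i, j, score)
```

The quadratic number of candidate intervals makes this sketch impractical for long series; the point of the paper's methods is precisely to make this search efficient.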


Kullback-Leibler Divergence Measurement for Clustering Based On Probability Distribution Similarity

Clustering based on distribution measurement is an essential task in mining methodology. Previous methods extend traditional partitioning-based clustering methods such as k-means and density-based clustering methods such as DBSCAN, which rely on geometric measurements between objects. Probability distributions have not been considered in measuring distance similarity between objects. In this paper, object...
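A minimal sketch of the idea in this abstract is to replace the geometric distance in a clustering algorithm with a symmetrised KL divergence between the objects' probability distributions (e.g. normalised histograms). The function names here are illustrative assumptions, not the paper's API:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q); eps avoids log(0) and division by zero."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    """Symmetrised KL, usable as a pairwise distance for clustering
    objects represented by probability distributions."""
    return kl_divergence(p, q) + kl_divergence(q, p)

a = [0.5, 0.3, 0.2]
b = [0.5, 0.3, 0.2]
c = [0.1, 0.1, 0.8]
print(symmetric_kl(a, b))  # essentially zero: identical distributions
print(symmetric_kl(a, c))  # clearly larger: dissimilar distributions
```

Symmetrisation is needed because KL divergence itself is not symmetric and so is not a metric; the symmetrised form at least satisfies d(p, q) = d(q, p).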


Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization

In this paper, we provide novel derivations of the probability hypothesis density (PHD) and cardinalised PHD (CPHD) filters without using probability generating functionals or functional derivatives. We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimisations after the prediction and update ste...




Journal:
  • Signal Processing

Volume 126  Issue 

Pages  -

Publication date: 2016